Convergence rate of incremental aggregated gradient algorithms
Abstract
Motivated by applications to distributed asynchronous optimization and large-scale data processing, we analyze the incremental aggregated gradient method for minimizing a sum of strongly convex functions from a novel perspective, simplifying the global convergence proofs considerably and proving a linear rate result. We also consider an aggregated method with momentum and show its linear convergence. We conclude by discussing extensions of our results to problems with an additional convex, possibly non-smooth function.
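The abstract describes the incremental aggregated gradient (IAG) update only in words. The sketch below illustrates the basic idea in Python under simple assumptions: a deterministic cyclic component order, a constant step size, and component gradients supplied as callables. The function names, step size, and toy data are illustrative choices, not taken from the paper.

import numpy as np

def iag(grads, x0, step, n_iters):
    """Minimal incremental aggregated gradient (IAG) sketch.

    Minimizes f(x) = sum_i f_i(x). Each iteration refreshes the gradient
    of a single component and steps along the aggregate of the stored
    (possibly stale) component gradients.
    """
    x = x0.copy()
    m = len(grads)
    # Memory of the most recently evaluated gradient of each component.
    table = [g(x) for g in grads]
    agg = sum(table)                      # aggregate of the stored gradients
    for k in range(n_iters):
        i = k % m                         # deterministic cyclic order (an assumption)
        new_g = grads[i](x)               # refresh only component i
        agg += new_g - table[i]
        table[i] = new_g
        x = x - step * agg                # step along the aggregated gradient
    return x

# Toy usage: f_i(x) = 0.5 * ||A_i x - b_i||^2 with random blocks,
# so each component is (almost surely) strongly convex.
rng = np.random.default_rng(0)
A = [rng.standard_normal((5, 3)) for _ in range(4)]
b = [rng.standard_normal(5) for _ in range(4)]
grads = [lambda x, A=Ai, b=bi: A.T @ (A @ x - b) for Ai, bi in zip(A, b)]
x_hat = iag(grads, np.zeros(3), step=0.01, n_iters=2000)

The momentum variant mentioned in the abstract would additionally add a heavy-ball term proportional to the previous displacement to the update; the sketch above omits it for brevity.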
Similar works
Proximal-Like Incremental Aggregated Gradient Method with Linear Convergence under Bregman Distance Growth Conditions
We introduce a unified algorithmic framework, called the proximal-like incremental aggregated gradient (PLIAG) method, for minimizing the sum of smooth convex component functions and a proper closed convex regularization function that is possibly non-smooth and extended-valued, with an additional abstract feasible set whose geometry can be captured by using the domain of a Legendre function. The PLI...
Incremental Aggregated Proximal and Augmented Lagrangian Algorithms
We consider minimization of the sum of a large number of convex functions, and we propose an incremental aggregated version of the proximal algorithm, which bears similarity to the incremental aggregated gradient and subgradient methods that have received a lot of recent attention. Under cost function differentiability and strong convexity assumptions, we show linear convergence for a sufficien...
On the Convergence Rate of Incremental Aggregated Gradient Algorithms
Motivated by applications to distributed optimization over networks and large-scale data processing in machine learning, we analyze the deterministic incremental aggregated gradient method for minimizing a finite sum of smooth functions where the sum is strongly convex. This method processes the functions one at a time in a deterministic order and incorporates a memory of previous gradient valu...
Analysis and Implementation of an Asynchronous Optimization Algorithm for the Parameter Server
This paper presents an asynchronous incremental aggregated gradient algorithm and its implementation in a parameter server framework for solving regularized optimization problems. The algorithm can handle both general convex (possibly non-smooth) regularizers and general convex constraints. When the empirical data loss is strongly convex, we establish a linear convergence rate, give explicit expr...
Linear Convergence of Proximal Incremental Aggregated Gradient Methods under Quadratic Growth Condition
Under the strong convexity assumption, several recent works studied the global linear convergence rate of the proximal incremental aggregated gradient (PIAG) method for minimizing the sum of a large number of smooth component functions and a non-smooth convex function. In this paper, under the quadratic growth condition, a strictly weaker condition than strong convexity, we derive a...
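The proximal variants discussed above (PIAG and related methods) follow the aggregated gradient step with a proximal step on the non-smooth term. The sketch below shows one common instantiation, assuming an l1 regularizer so that the proximal operator reduces to soft-thresholding; the helper names, cyclic order, and constant step size are hypothetical choices for illustration, not the exact schemes of the papers listed here.

import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1 (soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def piag_l1(grads, x0, step, lam, n_iters):
    """Proximal incremental aggregated gradient (PIAG) sketch for
    minimize sum_i f_i(x) + lam * ||x||_1.

    One component gradient is refreshed per iteration; the step along
    the aggregated (possibly delayed) gradient is followed by the
    proximal step of the regularizer.
    """
    x = x0.copy()
    m = len(grads)
    table = [g(x) for g in grads]         # stored component gradients
    agg = sum(table)
    for k in range(n_iters):
        i = k % m                         # cyclic order (an assumption)
        new_g = grads[i](x)
        agg += new_g - table[i]
        table[i] = new_g
        x = soft_threshold(x - step * agg, step * lam)
    return x

Replacing soft_threshold with the proximal operator of another closed convex regularizer (or a Bregman proximal step, as in the PLIAG framework above) recovers the other variants mentioned in these abstracts.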